
# Mistral architecture optimization

## Wanabi 24b V1 GGUF
License: Apache-2.0 · Author: kawaimasa
A large language model fine-tuned specifically for Japanese novel-writing support, distributed in the GGUF format (see the loading sketch below).
Tags: Large Language Model · Japanese
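
GGUF is the quantized checkpoint format used by llama.cpp. The sketch below shows one common way to load and run such a file locally with the llama-cpp-python bindings; the file name, context size, and prompt are illustrative assumptions, not values taken from this listing.

```python
# pip install llama-cpp-python
from llama_cpp import Llama

# Hypothetical local path to a quantized GGUF export of the model.
llm = Llama(model_path="./wanabi-24b-v1.Q4_K_M.gguf", n_ctx=4096)

# Simple completion call; llama-cpp-python returns an OpenAI-style dict.
out = llm("Write the opening line of a short story:", max_tokens=128)
print(out["choices"][0]["text"])
```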
## Cydonia V1.2 Magnum V4 22B
License: Other · Author: knifeayumu
A 22B-parameter language model merged from Cydonia-22B-v1.2 and Magnum-v4-22b using the SLERP method (sketched after this entry).
Tags: Large Language Model · Transformers
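
SLERP (spherical linear interpolation) merges two checkpoints by interpolating along the arc between their weight vectors rather than along the straight line between them, which better preserves weight magnitudes. Below is a minimal NumPy sketch; the interpolation factor t = 0.5 and the layer name in the usage comment are illustrative assumptions, not the actual recipe behind this merge.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherically interpolate between two weight tensors of the same shape."""
    a = v0.ravel().astype(np.float64)
    b = v1.ravel().astype(np.float64)
    # Cosine of the angle between the flattened weight vectors.
    cos_omega = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + eps)
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    if abs(np.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        merged = (1.0 - t) * a + t * b
    else:
        merged = (np.sin((1.0 - t) * omega) * a + np.sin(t * omega) * b) / np.sin(omega)
    return merged.reshape(v0.shape).astype(v0.dtype)

# Hypothetical usage: merge one layer from two checkpoints at t = 0.5.
# merged = slerp(0.5, state_a["model.layers.0.mlp.gate_proj.weight"],
#                     state_b["model.layers.0.mlp.gate_proj.weight"])
```

Applied per tensor across two same-architecture checkpoints, this yields the merged state dict.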
## Zion Alpha Instruction Tuned SLERP
License: Apache-2.0 · Author: SicariusSicariiStuff
Zion_Alpha_Instruction_Tuned_SLERP is a language model trained on Hebrew that performs well on tasks such as sentiment analysis and natural language inference.
Tags: Large Language Model · Transformers · Multilingual
## Japanese Starling ChatV 7B
License: Apache-2.0 · Author: TFMC
A 7B-parameter Japanese dialogue model developed from chatntq-ja-7b-v1.0, whose base architecture derives from Mistral-7B-v0.1.
Tags: Large Language Model · Transformers · Japanese
## Mistral 7B German Assistant V4
Author: flozi00
An Alpaca-style model fine-tuned for German instructions and dialogue, supporting an 8k-token context length.
Tags: Large Language Model · Transformers · German